Defining coverage of an operational domain using a modified nearest-neighbor metric
Authors
Abstract
Similar resources
Adaptive Metric nearest Neighbor Classification
Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with finite samples due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose a locally adaptive nearest neighbor classification method to try to minimize bias. We use a Chisqu...
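The plain nearest-neighbor rule that this abstract takes as its starting point can be sketched as follows; the function name, the sample data, and the choice of k are illustrative only and are not taken from the cited paper, which modifies the metric adaptively rather than using the fixed Euclidean distance shown here.

```python
from collections import Counter
import math

def knn_classify(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of (point, label) pairs, where each point is an
    equal-length tuple of coordinates. Uses plain Euclidean distance.
    """
    # Sort training pairs by distance from the query point.
    ranked = sorted(train, key=lambda pl: math.dist(pl[0], query))
    # Majority vote over the labels of the k closest points.
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]
```

The locally constant class-probability assumption the abstract criticizes is visible here: the vote treats all k neighbors as equally informative, which breaks down in high dimensions.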
A Modified K-Nearest Neighbor Algorithm Using Feature Optimization
A classification technique is an organized approach for building a classification model from a given input dataset. The learning algorithm of each technique is used to build a model that captures the relationship between the attribute set and the class label of the input data. The presence of irrelevant information in the dataset reduces the speed and quality of learning. The technique of feature sel...
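One simple way to discard uninformative features, in the spirit of the feature selection this abstract describes, is a variance threshold; the function name and threshold value below are illustrative assumptions, not the optimization method of the cited paper.

```python
def variance_filter(rows, threshold=1e-3):
    """Return the indices of feature columns whose variance exceeds
    `threshold`; near-constant columns carry little class information.

    `rows` is a list of equal-length numeric feature vectors.
    """
    n = len(rows)
    columns = list(zip(*rows))  # transpose: one tuple per feature
    keep = []
    for j, col in enumerate(columns):
        mean = sum(col) / n
        var = sum((v - mean) ** 2 for v in col) / n
        if var > threshold:
            keep.append(j)
    return keep
```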
A Modified Editing k-nearest Neighbor Rule
Classification of objects is an important task in a variety of fields and applications, and many different methods are available for making a decision in such cases. The k-nearest neighbor rule (k-NN) is a well-known nonparametric decision procedure. Classification rules based on the k-NN have already been proposed and applied in diverse substantive areas. The editing k-NN proposed by Wilson would be ...
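A common reading of Wilson's editing idea, which this abstract builds on, is to discard training samples that their own neighborhood would misclassify; the sketch below is a self-contained illustration under that assumption, using Euclidean distance, and is not the modified rule the cited paper proposes.

```python
from collections import Counter
import math

def wilson_edit(train, k=3):
    """Edit a training set in the style of Wilson's rule: drop each sample
    whose label disagrees with the majority label of its k nearest
    neighbors among the *other* training samples.

    `train` is a list of (point, label) pairs.
    """
    kept = []
    for i, (x, y) in enumerate(train):
        others = train[:i] + train[i + 1:]  # leave the sample itself out
        nearest = sorted(others, key=lambda pl: math.dist(pl[0], x))[:k]
        majority = Counter(label for _, label in nearest).most_common(1)[0][0]
        if majority == y:
            kept.append((x, y))
    return kept
```

Editing tends to smooth the decision boundary by removing points that sit inside the opposite class's region, such as mislabeled or noisy samples.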
Nearest-Neighbor Searching and Metric Space Dimensions
Given a set S of points in a metric space with distance function D, the nearest-neighbor searching problem is to build a data structure for S so that for an input query point q, the point s ∈ S that minimizes D(s, q) can be found quickly. We survey approaches to this problem, and its relation to concepts of metric space dimension. Several measures of dimension can be estimated using nearest-nei...
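The baseline against which the data structures surveyed in this abstract are measured is brute-force search, which evaluates D(s, q) for every s ∈ S; the sketch below assumes Euclidean distance by default but accepts any distance function, and its names are illustrative.

```python
import math

def nearest_neighbor(points, query, dist=math.dist):
    """Brute-force nearest-neighbor search over `points` for `query`.

    Costs one distance evaluation per stored point, i.e. O(|S|) per query;
    the metric-tree structures surveyed in the literature aim to do better
    by exploiting the triangle inequality.
    """
    return min(points, key=lambda s: dist(s, query))
```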
Adaptive Kernel Metric Nearest Neighbor Classification
Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose an adaptive nearest neighbor classification method to try to minimize bias. We use quasiconformal transformed kernels t...
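The kernel-metric idea this abstract relies on comes from the standard identity that any positive-definite kernel k induces a feature-space distance d(x, y)² = k(x, x) − 2 k(x, y) + k(y, y). The sketch below illustrates that identity with a plain RBF kernel; it does not implement the quasiconformal transformation of the cited paper, and the names and gamma value are illustrative.

```python
import math

def rbf(x, y, gamma=1.0):
    """Gaussian (RBF) kernel between two equal-length numeric tuples."""
    sq = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq)

def kernel_dist(x, y, k=rbf):
    """Distance induced by kernel k in its feature space:
    d(x, y)^2 = k(x, x) - 2*k(x, y) + k(y, y).

    max(..., 0.0) guards against tiny negative values from rounding.
    """
    return math.sqrt(max(0.0, k(x, x) - 2 * k(x, y) + k(y, y)))
```

Running the nearest-neighbor rule with such a kernel distance, instead of raw Euclidean distance, is what lets the neighborhood shape adapt when the kernel itself is modified locally.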
Journal
Title: Mechanical Systems and Signal Processing
Year: 2015
ISSN: 0888-3270
DOI: 10.1016/j.ymssp.2014.05.040